Is Generator Conditioning Causally Related to GAN Performance?
Authors
Abstract
Recent work (Pennington et al., 2017) suggests that controlling the entire distribution of Jacobian singular values is an important design consideration in deep learning. Motivated by this, we study the distribution of singular values of the Jacobian of the generator in Generative Adversarial Networks (GANs). We find that this Jacobian generally becomes ill-conditioned at the beginning of training. Moreover, we find that the average (with z ∼ p(z)) conditioning of the generator is highly predictive of two other ad-hoc metrics for measuring the “quality” of trained GANs: the Inception Score and the Frechet Inception Distance (FID). We test the hypothesis that this relationship is causal by proposing a “regularization” technique (called Jacobian Clamping) that softly penalizes the condition number of the generator Jacobian. Jacobian Clamping improves the mean Inception Score and the mean FID for GANs trained on several datasets. It also greatly reduces inter-run variance of the aforementioned scores, addressing (at least partially) one of the main criticisms of GANs.
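To make the regularizer concrete, below is a minimal sketch of how a conditioning estimate and a Jacobian Clamping-style penalty might be implemented, assuming a PyTorch generator G that maps latent vectors z to samples. The helper names and the constants eps, lambda_min, and lambda_max are illustrative assumptions, not the paper's exact settings or reference code.

```python
# Sketch only: assumes a PyTorch generator G taking a batch of latent vectors.
import torch


def condition_number(G, z):
    """Estimate the condition number of the generator Jacobian at a single
    latent vector z (shape [z_dim])."""
    J = torch.autograd.functional.jacobian(
        lambda v: G(v.unsqueeze(0)).flatten(), z)   # dG/dz, shape [out_dim, z_dim]
    s = torch.linalg.svdvals(J)                     # singular values of the Jacobian
    return s.max() / s.min()                        # largest over smallest singular value


def jacobian_clamping_penalty(G, z, eps=1e-2, lambda_min=1.0, lambda_max=20.0):
    """Soft penalty encouraging the finite-difference ratio
    Q = ||G(z') - G(z)|| / ||z' - z|| to stay inside [lambda_min, lambda_max],
    as a cheap proxy for controlling the Jacobian's singular values."""
    delta = torch.randn_like(z)
    delta = eps * delta / delta.norm(dim=1, keepdim=True)   # perturbation of norm eps
    diff = (G(z + delta) - G(z)).flatten(1).norm(dim=1)
    Q = diff / delta.norm(dim=1)
    # Squared hinge terms: zero while Q is inside the band, quadratic outside it.
    over = torch.clamp(Q - lambda_max, min=0.0)
    under = torch.clamp(lambda_min - Q, min=0.0)
    return (over ** 2 + under ** 2).mean()
```

In training, such a penalty would typically be multiplied by a small coefficient and added to the generator loss at each update step, leaving the discriminator objective unchanged.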
Similar resources
Dihedral angle prediction using generative adversarial networks
Several dihedral angle prediction methods have been developed for protein structure prediction and other applications. However, the distribution of predicted angles is often not similar to that of real angles. To address this, we employed generative adversarial networks (GANs), which have shown promising results in image generation tasks. Generative adversarial networks are composed of two adversarially...
Investigation of STATCOM effects on a synchronous generator impedance-based LOF relay, considering a realistic model for the excitation system of the generator
This paper studies the effects of a static synchronous compensator (STATCOM) on conventional synchronous generator loss of field (LOF) protection. To accomplish a comprehensive study, a typical and realistic excitation system is considered for the generator, using the phase-domain generator model available in the real-time digital simulator. Using such a system, the LOF phenomenon is realistically ...
CausalGAN: Learning Causal Implicit Generative Models with Adversarial Training
We introduce causal implicit generative models (CiGMs): models that allow sampling from not only the true observational but also the true interventional distributions. We show that adversarial training can be used to learn a CiGM, if the generator architecture is structured based on a given causal graph. We consider the application of conditional and interventional sampling of face images with ...
Global versus Localized Generative Adversarial Nets
In this paper, we present a novel localized Generative Adversarial Net (GAN) to learn on the manifold of real data. Compared with the classic GAN that globally parameterizes a manifold, the Localized GAN (LGAN) uses local coordinate charts to parameterize distinct local geometry of how data points can transform at different locations on the manifold. Specifically, around each point there exists...
Loss-Sensitive Generative Adversarial Networks on Lipschitz Densities
In this paper, we present a novel Loss-Sensitive GAN (LS-GAN) that learns a loss function to separate generated samples from their real examples. An important property of the LS-GAN is that it allows the generator to focus on improving poor data points that are far apart from real examples rather than wasting effort on samples that have already been well generated, and thus can improve...
Journal: CoRR
Volume: abs/1802.08768
Pages: -
Publication year: 2018